Latent Constraints: Learning to Generate Conditionally from Unconditional Generative Models
Authors
Abstract
Deep generative neural networks have proven effective at both conditional and unconditional modeling of complex data distributions. Conditional generation enables interactive control, but creating new controls often requires expensive retraining. In this paper, we develop a method to condition generation without retraining the model. By post-hoc learning latent constraints, value functions that identify regions in latent space that generate outputs with desired attributes, we can conditionally sample from these regions with gradient-based optimization or amortized actor functions. Combining attribute constraints with a universal “realism” constraint, which enforces similarity to the data distribution, we generate realistic conditional images from an unconditional variational autoencoder. Further, using gradient-based optimization, we demonstrate identity-preserving transformations that make the minimal adjustment in latent space to modify the attributes of an image. Finally, with discrete sequences of musical notes, we demonstrate zero-shot conditional generation, learning latent constraints in the absence of labeled data or a differentiable reward function.
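To make the mechanics concrete, the following is a minimal sketch, not the authors' released implementation, of conditional sampling by gradient-based optimization in the latent space of an already-trained VAE. The names decoder, attribute_critic, and realism_critic are hypothetical placeholders for a frozen decoder and two post-hoc learned value functions over latent codes (an attribute constraint and the "realism" constraint).

```python
# Hedged sketch (assumed interfaces, not the paper's code): sample conditionally
# from a pretrained, unconditional VAE by gradient ascent in latent space.
import torch

def conditional_sample(decoder, attribute_critic, realism_critic,
                       latent_dim=64, steps=200, lr=1e-2):
    # Start from a draw from the VAE prior p(z) = N(0, I).
    z = torch.randn(1, latent_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # Each critic scores the latent code directly; higher means the code is
        # more likely to decode to (a) the desired attribute and (b) a realistic
        # sample, so we ascend their sum (descend its negation).
        loss = -(attribute_critic(z) + realism_critic(z)).mean()
        loss.backward()
        opt.step()
    with torch.no_grad():
        return decoder(z.detach())
```

Under the same assumptions, an identity-preserving edit would start z at the encoding of an existing image and add a penalty on the distance from that starting point, so the optimization makes only the minimal adjustment in latent space needed to change the attribute.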
Similar resources
Data Generation as Sequential Decision Making
We connect a broad class of generative models through their shared reliance on sequential decision making. Motivated by this view, we develop extensions to an existing model, and then explore the idea further in the context of data imputation – perhaps the simplest setting in which to investigate the relation between unconditional and conditional generative modelling. We formulate data imputati...
One-Shot Generalization in Deep Generative Models
Humans have an impressive ability to reason about new concepts and experiences from just a single example. In particular, humans have an ability for one-shot generalization: they can encounter a new concept, understand its structure, and then generate compelling alternative variations of the concept. We develop machine learning systems with this important capacity by developing ...
Challenges with Variational Autoencoders for Text
We study variational autoencoders for text data to build a generative model that can be used to conditionally generate text. We introduce a mutual information criterion to encourage the model to put semantic information into the latent representation, and compare its efficacy with other tricks explored in the literature, such as KL divergence cost annealing and word dropout. We compare the log-likel...
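As a rough illustration of the KL divergence cost annealing mentioned in that snippet, the sketch below scales the KL term of a VAE loss by a weight that ramps from 0 to 1 over training; recon_loss, kl, and the warm-up length are assumed quantities, and the linear ramp is just one common schedule.

```python
# Hedged sketch of KL cost annealing: down-weight the KL term early in training
# to discourage the posterior from collapsing to the prior.
def kl_weight(step, warmup_steps=10_000):
    # Linear ramp from 0 to 1; sigmoid schedules are another common choice.
    return min(1.0, step / warmup_steps)

def annealed_elbo_loss(recon_loss, kl, step):
    # Negative ELBO with the annealed KL weight; both terms are per-batch scalars
    # assumed to be computed elsewhere.
    return recon_loss + kl_weight(step) * kl
```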
Conditioning of three-dimensional generative adversarial networks for pore and reservoir-scale models
Geostatistical modeling of petrophysical properties is a key step in modern integrated oil and gas reservoir studies. Recently, generative adversarial networks (GAN) have been shown to be a successful method for generating unconditional simulations of pore- and reservoir-scale models. This contribution leverages the differentiable nature of neural networks to extend GANs to the conditional simula...
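One common way to exploit that differentiability, sketched below under assumed names (generator for a frozen GAN generator, observed and mask for conditioning data such as measurements at known locations), is to search the latent space for a code whose generated realization reproduces the observations.

```python
# Hedged sketch (assumed interfaces): condition an unconditional GAN after
# training by optimizing the latent code to honor observed values.
import torch

def condition_gan(generator, observed, mask, latent_dim=128, steps=500, lr=5e-2):
    z = torch.randn(1, latent_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        x = generator(z)
        # Because the generator is differentiable, the mismatch at the
        # conditioning locations back-propagates to the latent code z.
        mismatch = ((x - observed) * mask).pow(2).sum()
        mismatch.backward()
        opt.step()
    with torch.no_grad():
        return generator(z.detach())
```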
Publication year: 2018